
    Statistical emulation of a tsunami model for sensitivity analysis and uncertainty quantification

    Due to the catastrophic consequences of tsunamis, early warnings need to be issued quickly in order to mitigate the hazard. Additionally, there is a need to represent the uncertainty in the predictions of tsunami characteristics corresponding to the uncertain trigger features (e.g. the position, shape and speed of a landslide, or the sea floor deformation associated with an earthquake). Unfortunately, computer models are expensive to run, which leads to significant delays in predictions and makes uncertainty quantification impractical. Statistical emulators run almost instantaneously and can represent the outputs of the computer model well. In this paper, we use the Outer Product Emulator to build a fast statistical surrogate of a landslide-generated tsunami computer model. This Bayesian framework enables us to build the emulator by combining prior knowledge of the computer model properties with a few carefully chosen model evaluations. The good performance of the emulator is validated using the Leave-One-Out method.
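
    The following is a minimal sketch of the emulate-then-validate idea described above, using scikit-learn's Gaussian process regressor as a generic stand-in for the Outer Product Emulator; the toy simulator, design size and kernel are illustrative assumptions, not the paper's setup.

```python
# Minimal sketch: fit a Gaussian process emulator to a handful of "simulator"
# runs and check it with Leave-One-Out validation. The simulator below is a
# hypothetical cheap stand-in for the landslide-tsunami code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def simulator(x):
    # stand-in for one scalar output of the expensive model
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1])

rng = np.random.default_rng(0)
X_train = rng.uniform(0, 1, size=(30, 2))          # 30 carefully chosen runs
y_train = simulator(X_train)

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)

# Leave-One-Out: refit without each design point and predict it back
loo_errors = []
for i in range(len(X_train)):
    keep = np.arange(len(X_train)) != i
    gp.fit(X_train[keep], y_train[keep])
    mean, std = gp.predict(X_train[i:i + 1], return_std=True)
    loo_errors.append((y_train[i] - mean[0]) / std[0])  # standardized error

# If the emulator is well calibrated, these errors behave like N(0, 1) draws
print(np.round(loo_errors, 2))
```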

    Statistical calibration of CFD modelling for street canyon flows

    CFD simulations of complex outdoor environments present a significant modelling challenge. Here, simulations of airflow within an idealized street canyon are performed. We test the model sensitivity to the empirical constants contained within the κ-ε turbulence model and examine how a systematic variation of these values can produce improved predictions of the turbulent kinetic energy when compared against wind tunnel data. The Bayesian statistical calibration shows the range of values the constants should take, resulting in improved CFD simulations in the region of flow inside the street canyon, which is normally very difficult to resolve accurately in CFD models.
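
    Below is a minimal sketch of the kind of Bayesian calibration described above: a random-walk Metropolis sampler over one κ-ε constant (C_mu is chosen purely for illustration), comparing a hypothetical cheap surrogate of the predicted turbulent kinetic energy against synthetic "wind tunnel" data. The surrogate, prior bounds and noise level are all assumptions.

```python
# Minimal sketch: Bayesian calibration of a single turbulence-model constant
# by random-walk Metropolis against synthetic wind-tunnel observations.
import numpy as np

rng = np.random.default_rng(1)

def tke_surrogate(c_mu, z):
    # hypothetical surrogate mapping the constant to a TKE profile
    return 0.05 * np.exp(-z / (2.0 * c_mu))

z_obs = np.linspace(0.1, 1.0, 10)                          # measurement heights
data = tke_surrogate(0.09, z_obs) + rng.normal(0, 0.002, 10)  # synthetic data
sigma = 0.002                                              # assumed obs error

def log_post(c_mu):
    if not 0.03 < c_mu < 0.2:                              # assumed uniform prior
        return -np.inf
    resid = data - tke_surrogate(c_mu, z_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

samples, c = [], 0.09
for _ in range(5000):                                      # random-walk Metropolis
    prop = c + rng.normal(0, 0.005)
    if np.log(rng.uniform()) < log_post(prop) - log_post(c):
        c = prop
    samples.append(c)

print("posterior mean of C_mu:", np.mean(samples[1000:]))
```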

    Statistical emulation of landslide-induced tsunamis at the Rockall Bank, NE Atlantic

    Statistical methods constitute a useful approach to understand and quantify the uncertainty that governs complex tsunami mechanisms. Numerical experiments often have a high computational cost, which is a limiting factor for uncertainty and sensitivity analyses, where numerous simulations are required. Statistical emulators, as surrogates of these simulators, can provide predictions of the physical process in a much faster and computationally inexpensive way, making it possible to explore thousands of scenarios that would otherwise be numerically expensive and difficult to achieve. In this work, we build a statistical emulator of the deterministic codes used to simulate submarine sliding and tsunami generation at the Rockall Bank, NE Atlantic Ocean, in two stages. First, we calibrate the parameters used in the landslide simulations against observations of the landslide deposits. This calibration is performed in a Bayesian framework using Gaussian Process (GP) emulators to approximate the landslide model and the discrepancy function between model and observations, and yields distributions of the calibrated input parameters. In the second stage, a GP emulator is built to mimic the coupled landslide-tsunami numerical process. The emulator propagates the uncertainties in the distributions of the calibrated input parameters inferred from the first stage to the outputs. As a result, a quantification of the uncertainty of the maximum free surface elevation at specified locations is obtained.
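
    A minimal sketch of the second stage is given below: posterior samples of the calibrated landslide parameters (assumed available from stage one) are pushed through a GP emulator of the coupled landslide-tsunami code to obtain a distribution of maximum free surface elevation at one gauge. The toy simulator, kernel and posterior are illustrative assumptions.

```python
# Minimal sketch: propagate calibrated input uncertainty through a GP emulator.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def coupled_simulator(theta):
    # hypothetical stand-in for the landslide-tsunami simulator output
    return 2.0 * theta[:, 0] ** 2 + 0.5 * np.sin(4 * theta[:, 1])

rng = np.random.default_rng(2)
design = rng.uniform(0, 1, size=(40, 2))            # training runs of the simulator
gp = GaussianProcessRegressor(Matern(length_scale=[0.3, 0.3], nu=2.5),
                              normalize_y=True).fit(design, coupled_simulator(design))

# posterior samples of the calibrated inputs from stage one (assumed)
theta_post = rng.normal([0.6, 0.4], [0.05, 0.08], size=(10000, 2)).clip(0, 1)

# propagate: emulator prediction for each posterior draw gives a distribution
# of maximum free surface elevation at the gauge
eta_max = gp.predict(theta_post)
print("max elevation: mean %.2f m, 95%% interval (%.2f, %.2f) m"
      % (eta_max.mean(), *np.percentile(eta_max, [2.5, 97.5])))
```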

    Faster Than Real Time Tsunami Warning with Associated Hazard Uncertainties

    Tsunamis are unpredictable events that are catastrophic in their potential for destruction of human lives and economies. The unpredictability of their occurrence poses a challenge to the tsunami community, as it is difficult to obtain estimates of recurrence rates and severity from the tsunamigenic records. Accurate and efficient mathematical/computational modelling is thus called upon to provide tsunami forecasts and hazard assessments. Compounding this challenge for warning centres is the physical nature of tsunamis, which can travel at extremely high speeds in the open ocean or be generated close to the shoreline. Thus, tsunami forecasts must be not only accurate but also delivered under severe time constraints. In the immediate aftermath of a tsunamigenic earthquake event, there are uncertainties in the source, such as location, rupture geometry, depth and magnitude. Ideally, these uncertainties should be represented in a tsunami warning. In practice, however, quantifying the uncertainties in the hazard intensity (i.e. maximum tsunami amplitude) due to the uncertainties in the source is not feasible, since it requires a large number of high resolution simulations. We approximate the functionally complex and computationally expensive high resolution tsunami simulations with a simple and cheap statistical emulator. A workflow integrating the entire chain of components from the tsunami source to the quantification of hazard uncertainties is developed here: quantification of uncertainties in tsunamigenic earthquake sources, high resolution simulation of tsunami scenarios using the GPU version of Volna-OP2 on a non-uniform mesh for an ensemble of sources, construction of an emulator using the simulations as training data, and prediction of hazard intensities with associated uncertainties using the emulator. Thus, with the massively parallelized finite volume tsunami code Volna-OP2 at the heart of the workflow, we use statistical emulation to compute uncertainties in hazard intensity at locations of interest. Such an integration also balances the trade-off between computationally expensive simulations and the desired accuracy of uncertainties, within given time constraints. The developed workflow is fully generic and independent of the source (the 1945 Makran earthquake) studied here.
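
    The sketch below illustrates the emulation step of such a workflow: one Gaussian process per location of interest, trained on an ensemble of source scenarios, returning maximum tsunami amplitude with an uncertainty band in a fraction of a second rather than the run time of a full simulation ensemble. The source parameters, locations and toy "ensemble" outputs are illustrative, not the Volna-OP2 setup.

```python
# Minimal sketch: per-location GP emulators trained on an ensemble of sources,
# then evaluated on thousands of uncertain source realizations in near-real time.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)
sources = rng.uniform(0, 1, size=(60, 4))      # ensemble of normalized source
                                               # parameters (location, depth, ...)
def ensemble_output(s, loc):
    # hypothetical stand-in for the simulated maximum amplitude at one location
    return (1.0 + loc) * s[:, 3] * np.exp(-np.abs(s[:, 0] - 0.5))

locations = [0, 1, 2]                          # indices of points of interest
emulators = {}
for loc in locations:
    gp = GaussianProcessRegressor(RBF(length_scale=0.3), normalize_y=True)
    emulators[loc] = gp.fit(sources, ensemble_output(sources, loc))

# after an event: evaluate an uncertain source description almost instantly
uncertain_source = rng.normal([0.5, 0.5, 0.5, 0.7], 0.05, size=(2000, 4)).clip(0, 1)
for loc, gp in emulators.items():
    amp = gp.predict(uncertain_source)
    print("location %d: amplitude %.2f m +/- %.2f m" % (loc, amp.mean(), amp.std()))
```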

    Probabilistic, high-resolution tsunami predictions in northern Cascadia by exploiting sequential design for efficient emulation

    The potential of a full-margin rupture along the Cascadia subduction zone poses a significant threat over a populous region of North America. Previous probabilistic tsunami hazard assessment studies produced hazard curves based on simulated predictions of tsunami waves, either at low resolution, at high resolution only for a local area or for a limited range of scenarios, or at a high computational cost to generate hundreds of high-resolution scenarios. We use the graphics processing unit (GPU)-accelerated tsunami simulator VOLNA-OP2 with a detailed representation of topographic and bathymetric features. To overcome the large computational burden, we replace the simulator by a Gaussian process emulator at each output location. The emulators are statistical approximations of the simulator's behaviour. We train the emulators on a set of input–output pairs and use them to generate approximate output values over a six-dimensional scenario parameter space (e.g. uplift/subsidence ratio and maximum uplift) that represents the seabed deformation. We implement an advanced sequential design algorithm for the optimal selection of only 60 simulations. The low cost of emulation provides additional flexibility in the shape of the deformation, which we illustrate here by considering two families of 2000 potential scenarios: buried rupture and splay faulting. This approach allows for the first emulation-accelerated computation of probabilistic tsunami hazard in the region of the city of Victoria, British Columbia.
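
    A minimal sketch of sequential design for emulation follows: starting from a small design, each new simulator run is placed where the current Gaussian process is least certain (maximum predictive variance). This is a simplified stand-in for the advanced sequential algorithm used in the study; the toy simulator and the unit-cube bounds of the 6-D scenario space are assumptions.

```python
# Minimal sketch: greedy sequential design driven by GP predictive variance.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def simulator(x):
    # hypothetical stand-in for the maximum wave height from the tsunami code
    return np.sum(np.sin(2 * np.pi * x), axis=1)

rng = np.random.default_rng(4)
X = rng.uniform(0, 1, size=(10, 6))             # small initial design in 6-D
y = simulator(X)

candidates = rng.uniform(0, 1, size=(2000, 6))  # dense candidate set
budget = 60                                     # total number of simulations allowed

while len(X) < budget:
    gp = GaussianProcessRegressor(Matern(nu=2.5), normalize_y=True).fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    nxt = candidates[np.argmax(std)]            # most uncertain candidate
    X = np.vstack([X, nxt])
    y = np.append(y, simulator(nxt[None, :]))   # run the simulator there

print("final design size:", len(X))
```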

    Statistical diagnostic and correction of a chemistry-transport model for the prediction of total column ozone

    In this paper, we introduce a statistical method for examining and adjusting chemistry-transport models. We illustrate the findings with total column ozone predictions, based on the University of Illinois at Urbana-Champaign 2-D (UIUC 2-D) chemistry-transport model of the global atmosphere. We propose a general diagnostic procedure for the model outputs in total ozone over latitudes ranging from 60° South to 60° North to see if the model captures some typical patterns in the data. The method proceeds in two steps to avoid possible collinearity issues. First, we regress the measurements given by a cohesive data set from the SBUV(/2) satellite system on the model outputs with an autoregressive noise component. Second, we regress the residuals of this first regression on the solar flux, the annual cycle, the Antarctic or Arctic Oscillation, and the Quasi-Biennial Oscillation. If the coefficients from this second regression are statistically significant, this indicates that the model did not properly simulate the patterns associated with these factors. Systematic anomalies of the model are identified using data from 1979 to 1995 and statistically corrected afterwards. The 1996-2003 validation sample confirms that the combined approach yields better predictions than the direct UIUC 2-D outputs.
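
    A minimal sketch of the two-step diagnostic follows, on synthetic data: (1) regress "observed" total ozone on the model output allowing AR(1) noise, (2) regress the residuals on candidate forcing proxies (solar flux, annual cycle and QBO here, all synthetic) and inspect which coefficients are significant. The variable names, data and noise structure are illustrative assumptions.

```python
# Minimal sketch: two-step regression diagnostic of a model time series.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 240                                          # 20 years of monthly values
t = np.arange(n)
solar = np.sin(2 * np.pi * t / 132)              # ~11-year solar cycle proxy
annual = np.cos(2 * np.pi * t / 12)
qbo = np.sin(2 * np.pi * t / 28)

model_out = 300 + 10 * annual + rng.normal(0, 2, n)        # model misses solar term
obs = 300 + 10 * annual + 3 * solar + rng.normal(0, 2, n)  # synthetic observations

# Step 1: observations regressed on the model output with AR(1) errors
step1 = sm.GLSAR(obs, sm.add_constant(model_out), rho=1).iterative_fit(maxiter=10)
resid = step1.resid

# Step 2: residuals regressed on forcing factors; significant terms flag
# patterns the model did not simulate properly
X2 = sm.add_constant(np.column_stack([solar, annual, qbo]))
step2 = sm.OLS(resid, X2).fit()
print(step2.summary(xname=["const", "solar", "annual", "QBO"]))
```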

    Global ensemble of temperatures over 1850-2018: quantification of uncertainties in observations, coverage, and spatial modeling (GETQUOCS)

    Instrumental global temperature records are derived from the network of in situ measurements of land and sea surface temperatures. This observational evidence is seen as fundamental to climate science, so the accuracy of these measurements is of prime importance for the analysis of temperature variability. There are spatial gaps in the distribution of instrumental temperature measurements across the globe, and this lack of spatial coverage introduces coverage error. An approximate Bayesian computation (ABC) based multi-resolution lattice kriging is developed and used to quantify the coverage errors through the variance of the spatial process at multiple spatial scales. It critically accounts for the uncertainties in the parameters of this advanced spatial statistics model itself, thereby providing, for the first time, a full description of the spatial coverage uncertainties along with the uncertainties in the modelling of these spatial gaps. These coverage errors are combined with the existing estimates of uncertainties due to observational issues at each station location. The result is an ensemble of 100 000 monthly temperature fields over the entire globe that samples the combination of coverage, parametric and observational uncertainties from 1850 to 2018 over a 5°×5° grid.
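
    The sketch below shows only the ABC idea behind the coverage-error quantification: draw a spatial-process variance from its prior, simulate anomalies at the observed station mask, and keep draws whose summary statistic matches the data. This is a heavily simplified, single-scale stand-in for the multi-resolution lattice kriging model; all data, the prior and the tolerance are synthetic assumptions.

```python
# Minimal sketch: ABC rejection sampling for a spatial-process variance.
import numpy as np

rng = np.random.default_rng(6)
n_stations = 200
true_sigma = 1.3
observed = rng.normal(0, true_sigma, n_stations)   # synthetic station anomalies
obs_summary = observed.std()

accepted = []
for _ in range(20000):
    sigma = rng.uniform(0.1, 3.0)                  # prior on the process std dev
    sim = rng.normal(0, sigma, n_stations)         # simulated field at the stations
    if abs(sim.std() - obs_summary) < 0.05:        # ABC acceptance tolerance
        accepted.append(sigma)

accepted = np.array(accepted)
print("ABC posterior for sigma: mean %.2f, 95%% interval (%.2f, %.2f)"
      % (accepted.mean(), *np.percentile(accepted, [2.5, 97.5])))
```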

    Probabilistic quantification of tsunami current hazard using statistical emulation

    In this paper, statistical emulation is shown to be an essential tool for the end-to-end physical and numerical modelling of local tsunami impact, i.e. from the earthquake source to tsunami velocities and heights. In order to surmount the prohibitive computational cost of running a large number of simulations, the emulator, constructed using 300 training simulations from a validated tsunami code, yields 1 million predictions. This constitutes a record for any realistic tsunami code to date, and is a leap in tsunami science since high-risk but low-probability hazard thresholds can be quantified. To illustrate the efficacy of emulation, we map probabilistic representations of maximum tsunami velocities and heights at around 200 locations around Karachi port. The 1 million predictions comprehensively sweep through a range of possible future tsunamis originating from the Makran Subduction Zone (MSZ). We rigorously model each step in the tsunami life cycle: the first use of the three-dimensional subduction geometry Slab2 in the MSZ, the most refined fault segmentation in the MSZ, the first sediment enhancements of seabed deformation (up to 60% locally), and a bespoke unstructured meshing algorithm. Owing to the synthesis of emulation and meticulous numerical modelling, we also discover substantial local variations of currents and heights. Funding: Alan Turing Institute under EPSRC grant no. EP/N510129/1; Royal Society grant no. CHL/R1/180173; Royal Society-SERB Newton International Fellowship NF151483; NERC grant no. NE/P016367/1.
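
    A very small sketch of how such a large set of emulator predictions turns into probabilistic hazard statements is given below: exceedance probabilities for maximum tsunami height at one location. The predictions are drawn here from a toy distribution in place of the real emulator outputs; the thresholds are illustrative.

```python
# Minimal sketch: exceedance probabilities from a large set of emulated heights.
import numpy as np

rng = np.random.default_rng(7)
heights = rng.lognormal(mean=0.0, sigma=0.6, size=1_000_000)  # stand-in predictions (m)

for threshold in [1.0, 2.0, 3.0, 5.0]:
    p = np.mean(heights > threshold)
    print("P(max height > %.1f m) = %.4f" % (threshold, p))
```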

    Expression of the bacterial type III effector DspA/E in Saccharomyces cerevisiae downregulates the sphingolipid biosynthetic pathway leading to growth-arrest

    Erwinia amylovora, the bacterium responsible for fire blight, relies on a type III secretion system and a single injected effector, DspA/E, to induce disease in host plants. DspA/E belongs to the widespread AvrE family of type III effectors, which suppress plant defense responses and promote bacterial growth following infection. Ectopic expression of DspA/E in plants or in Saccharomyces cerevisiae is toxic, indicating that DspA/E likely targets a cellular process conserved between yeast and plants. To unravel the mode of action of DspA/E, we screened the Euroscarf S. cerevisiae library for mutants resistant to DspA/E-induced growth arrest. The most resistant mutants (Δsur4, Δfen1, Δipt1, Δskn1, Δcsg1, Δcsg2, Δorm1, Δorm2) were impaired in the sphingolipid biosynthetic pathway. Exogenously supplied sphingolipid precursors, such as the long chain bases (LCBs) phytosphingosine and dihydrosphingosine, also suppressed the DspA/E-induced yeast growth defect. Expression of DspA/E in yeast downregulated LCB biosynthesis and induced a rapid decrease in LCB levels, indicating that SPT, the first and rate-limiting enzyme of the sphingolipid biosynthetic pathway, was repressed. SPT downregulation was mediated by dephosphorylation and activation of Orm proteins, which negatively regulate SPT. A Δcdc55 mutation, affecting Cdc55-PP2A protein phosphatase activity, prevented Orm dephosphorylation and suppressed DspA/E-induced growth arrest.